Explaining Bagging with Monte Carlo Theory

Authors

  • Roberto Esposito
  • Lorenza Saitta
Abstract

In this paper we propose using the framework of Monte Carlo stochastic algorithms to analyze ensemble learning, specifically bagging. In particular, this framework explains bagging's behavior and why increasing the margin improves performance. Experimental results support the theoretical analysis.
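The paper analyzes bagging, i.e. training many copies of a base learner on bootstrap replicates of the data and aggregating their predictions by majority vote. A minimal sketch of that procedure (not the authors' implementation; the base learner, data, and function names are illustrative):

```python
import random
from collections import Counter

def bagging_predict(train, x, base_learner, n_models=25, rng=None):
    """Bootstrap-aggregate a base learner: fit each model on an
    n-out-of-n sample drawn with replacement, then majority-vote."""
    rng = rng or random.Random(0)
    votes = []
    for _ in range(n_models):
        sample = [rng.choice(train) for _ in train]  # bootstrap replicate
        model = base_learner(sample)
        votes.append(model(x))
    # The ensemble prediction is the most frequent label among the votes.
    return Counter(votes).most_common(1)[0][0]

# A deliberately simple, unstable base learner: 1-nearest-neighbour
# on 1-D labelled points (point, label).
def one_nn(sample):
    def predict(x):
        return min(sample, key=lambda p: abs(p[0] - x))[1]
    return predict

train = [(0.0, 'a'), (0.2, 'a'), (0.9, 'b'), (1.1, 'b')]
print(bagging_predict(train, 1.0, one_nn))  # prints 'b'
```

Each individual bootstrap model can be unstable (a replicate may miss informative points), but the majority vote over many replicates is far less sensitive to any single resample, which is the behavior the Monte Carlo framework is used to explain.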


Related papers

Monte Carlo Theory as an Explanation of Bagging and Boosting

In this paper we propose the framework of Monte Carlo algorithms as a useful one for analyzing ensemble learning. In particular, this framework allows one to guess when bagging will be useful, explains why increasing the margin improves performance, and suggests a new way of performing ensemble learning and error estimation.


On Bagging and Estimation in Multivariate Mixtures

Two bagging approaches, namely n/2-out-of-n without replacement (subagging) and n-out-of-n with replacement (bagging), have been applied to the problem of estimating the parameters of a multivariate mixture model. It has been observed, through Monte Carlo simulations and a real data example, that both bagging methods improve the standard deviation of the maximum likelihood estimator of the mix...
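The two resampling schemes contrasted in this abstract differ only in how each replicate is drawn. A small illustrative sketch (function names are my own, not from the paper):

```python
import random

def bootstrap_sample(data, rng):
    """Bagging resample: n draws with replacement, so points may repeat."""
    return [rng.choice(data) for _ in data]

def subsample_half(data, rng):
    """Subagging resample: n/2 distinct points drawn without replacement."""
    return rng.sample(data, len(data) // 2)

rng = random.Random(1)
data = list(range(10))
boot = bootstrap_sample(data, rng)
sub = subsample_half(data, rng)
assert len(boot) == 10       # same size as the data, duplicates allowed
assert len(set(sub)) == 5    # half the size, all points distinct
```

Either kind of replicate is then fed to the estimator (here, the maximum likelihood estimator of the mixture parameters) and the per-replicate estimates are averaged.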


Bagging Binary Predictors for Time Series

Bootstrap aggregating, or bagging, introduced by Breiman (1996a), has proved effective at improving on unstable forecasts. Theoretical and empirical work using classification and regression trees and variable selection in linear and non-linear regression has shown that bagging can generate substantial prediction gains. However, most of the existing literature on bagging has been limited to t...


Bagging During Markov Chain Monte Carlo for Smoother Predictions

Herbert K. H. Lee, University of California, Santa Cruz. Abstract: Making good predictions from noisy data is a challenging problem. Methods to improve the robustness of predictions include bagging and Bayesian shrinkage approaches. These methods can be gainfully combined by doing bootstrap resampling during Markov chain Monte Carlo (MCMC). The result is smoother predictions that are ...


Bagging Constrained Equity Premium Predictors

The literature on excess return prediction has considered a wide array of estimation schemes, among them unrestricted and restricted regression coefficients. We consider bootstrap aggregation (bagging) to smooth parameter restrictions. Two types of restrictions are considered: positivity of the regression coefficient and positivity of the forecast. Bagging constrained estimators can have smalle...


Publication date: 2003